On almost Linearity of Low Dimensional Projections from High Dimensional Data

Authors
Abstract


Similar articles


Efficient Recovery of Low-Dimensional Structure from High-Dimensional Data

Many modeling tasks in computer vision, e.g. structure from motion, shape/reflectance from shading, and filter synthesis, have a low-dimensional intrinsic structure even though the dimension of the input data can be relatively large. We propose a simple but surprisingly effective iterative randomized algorithm that drastically cuts down the time required for recovering the intrinsic structure. The compu...
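As a rough illustration of that idea (not necessarily the paper's own algorithm), a randomized range finder followed by a small SVD recovers a low-dimensional subspace from high-dimensional data much faster than a full decomposition; the function name and the target rank `k` below are hypothetical.

```python
import numpy as np

def randomized_low_rank(X, k, n_iter=2, seed=0):
    """Approximate the top-k subspace of X (n_samples x n_features)
    with a randomized range finder plus a small SVD."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.standard_normal((d, k + 5))   # random test directions
    Y = X @ Omega                             # n x (k+5) sketch of the range
    for _ in range(n_iter):                   # power iterations sharpen the spectrum
        Y = X @ (X.T @ Y)
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis for the sketched range
    B = Q.T @ X                               # small (k+5) x d problem
    U_b, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_b)[:, :k], s[:k], Vt[:k]    # rank-k factors of X

# Example: 5000 points in 1000 dimensions with intrinsic dimension 3 plus noise.
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 3)) @ rng.standard_normal((3, 1000))
X += 0.01 * rng.standard_normal(X.shape)
U, s, Vt = randomized_low_rank(X, k=3)
print(s)   # three dominant singular values; the remainder of the spectrum is noise
```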


Manual Controls For High-Dimensional Data Projections

Projections of high-dimensional data onto low-dimensional subspaces provide insightful views for understanding multivariate relationships. In this paper we discuss how to manually control the variable contributions to the projection. The user has control of the way a particular variable contributes to the viewed projection and can interactively adjust the variable's contribution. These manual c...
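As a minimal sketch of this kind of manual control (not the paper's actual interface; the function and variable names are hypothetical), one can nudge a single variable's weight in a unit projection direction and renormalize, so the viewed projection responds continuously to the user's adjustment.

```python
import numpy as np

def adjust_contribution(direction, var_index, delta):
    """Increase or decrease one variable's contribution to a 1-D projection
    direction, then renormalize so it stays a unit vector."""
    d = direction.copy().astype(float)
    d[var_index] += delta          # manual nudge chosen by the user
    return d / np.linalg.norm(d)   # keep it a valid projection direction

# Example: 4 variables; start by viewing variable 0 only, then let
# variable 2 contribute progressively more to the projected view.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))            # high-dimensional data
direction = np.array([1.0, 0.0, 0.0, 0.0])   # initial projection direction
for step in range(3):
    direction = adjust_contribution(direction, var_index=2, delta=0.2)
    view = X @ direction                      # 1-D projection shown to the user
    print(step, np.round(direction, 3), view[:3].round(2))
```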


Learning from High Dimensional fMRI Data using Random Projections

The term “Curse of Dimensionality” refers to the difficulty of organizing and applying machine learning to data in a very high dimensional space. The reason for this difficulty is that as the dimensionality increases, the volume of the space grows so rapidly that the training examples become sparse and difficult to classify. So, the predictive power of a machine learning algorith...
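A minimal sketch of the random-projection idea (dimensions and names below are illustrative, not taken from the paper): multiplying the data by a scaled Gaussian random matrix maps it to a much lower dimension while approximately preserving pairwise distances, which is what keeps downstream learning feasible.

```python
import numpy as np

def random_projection(X, target_dim, seed=0):
    """Project n x d data X down to n x target_dim with a scaled Gaussian
    random matrix (Johnson-Lindenstrauss style)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R = rng.standard_normal((d, target_dim)) / np.sqrt(target_dim)
    return X @ R

# Example: 100 feature vectors in 10,000 dimensions reduced to 300.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10_000))
Z = random_projection(X, target_dim=300)

# Pairwise distances are roughly preserved after projection.
i, j = 0, 1
print(np.linalg.norm(X[i] - X[j]), np.linalg.norm(Z[i] - Z[j]))
```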


High-Dimensional Principal Projections

Principal Component Analysis (PCA) is a famous technique from multivariate statistics. It is frequently carried out for dimension reduction, either for functional data or in a high dimensional framework. To that aim, PCA yields the eigenvectors φ̂_i of the covariance operator of a sample of interest. Dimension reduction is obtained by projecting on the eigenspaces spanned by the φ̂_i's, usually ...
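A minimal sketch of the projection step described above, with illustrative dimensions and names: estimate the sample covariance, take its leading eigenvectors (the empirical φ̂_i's), and project the centered data onto the span of the first few.

```python
import numpy as np

def pca_project(X, k):
    """Project centered data onto the span of the top-k eigenvectors
    of the sample covariance matrix (the empirical principal directions)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance operator
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    phi = eigvecs[:, ::-1][:, :k]           # top-k eigenvectors as columns
    return Xc @ phi, phi                    # scores and the retained basis

# Example: 300 observations of a 20-dimensional vector with 2 dominant directions.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2)) @ rng.standard_normal((2, 20))
X += 0.05 * rng.standard_normal(X.shape)
scores, phi = pca_project(X, k=2)
print(scores.shape, phi.shape)   # (300, 2) (20, 2)
```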



Journal

Journal title: The Annals of Statistics

Year: 1993

ISSN: 0090-5364

DOI: 10.1214/aos/1176349155